Storage of Natural Language Sentences in a Hopfield Network

Author

  • Nigel Collier
Abstract

This paper looks at how the Hopfield neural network can be used to store and recall patterns constructed from natural language sentences. As a pattern recognition and storage tool, the Hopfield neural network has received much attention. This attention, however, has been mainly in the field of statistical physics, due to the model's simple abstraction of spin glass systems. The differences, characterized as bias and correlation, between natural language sentence patterns and the randomly generated patterns used in previous experiments are discussed. Results are given for numerical simulations which show the auto-associative competence of the network when trained with natural language patterns.
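The auto-associative storage and recall that the abstract describes can be sketched as a standard Hebbian-rule Hopfield network. This is a generic illustration, not the paper's own code: the sentence-to-pattern encoding studied in the paper is not reproduced, and the bipolar patterns below are invented for demonstration.

```python
import numpy as np

# Minimal Hopfield auto-associative memory (illustrative sketch only; the
# paper's actual sentence-to-pattern encoding is not reproduced here).

def train(patterns):
    """Hebbian learning: W = (1/N) * sum of outer products, zero diagonal."""
    n = patterns.shape[1]
    w = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(w, 0)
    return w

def recall(w, state, max_steps=10):
    """Synchronous sign updates until a fixed point (or the step limit)."""
    for _ in range(max_steps):
        new = np.where(w @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

# Two orthogonal bipolar patterns over 8 units.
p0 = np.array([1, 1, 1, 1, 1, 1, 1, 1])
p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
w = train(np.stack([p0, p1]))

cue = p0.copy()
cue[0] = -1                                  # corrupt one unit of the stored pattern
print(np.array_equal(recall(w, cue), p0))    # -> True
```

Recall from a corrupted cue converges back to the stored pattern because each stored pattern sits at a minimum of the network's energy function.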


Related papers

Sentence Recognition Using Hopfield Neural Network

Communication in natural languages between computational systems and humans is an area that has attracted researchers for a long time. This type of communication can have wide ramifications, as such a system could find wide usage in several areas. Web browsing via input given as textual commands/sentences in natural languages is one such area. However, the enormous amount of input that could be given in ...


Automatic Correction of Persian Typos Using a Hybrid Artificial Neural Network

Automatic correction of typos in typed texts is one of the goals of research in artificial intelligence, data mining, and natural language processing. Most existing methods are based on searching in dictionaries and determining the similarity between dictionary entries and the given word. This paper presents the design, implementation, and evaluation of a Farsi typo correction system u...
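The dictionary-lookup baseline this abstract mentions can be sketched as follows. This is a hedged illustration of the general technique (rank dictionary entries by edit distance to the typed word), not the paper's hybrid neural-network system; the dictionary and words below are invented examples.

```python
# Rank dictionary entries by Levenshtein edit distance to the typed word
# (a generic sketch of dictionary-based typo correction, not the paper's
# hybrid neural-network method).

def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                   # deletion
                           cur[j - 1] + 1,                # insertion
                           prev[j - 1] + (ca != cb)))     # substitution
        prev = cur
    return prev[-1]

def correct(word, dictionary):
    """Return the dictionary entry closest to `word` in edit distance."""
    return min(dictionary, key=lambda entry: edit_distance(word, entry))

dictionary = ["network", "neural", "language", "pattern"]
print(correct("langauge", dictionary))   # -> language
```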


Convergence Time Characteristics of an Associative Memory for Natural Language Processing

We take a new look at one of the fundamental properties of discrete-time associative memory and show how it can be adapted for natural language processing (NLP). Many tasks in NLP could benefit from such associative functionality, particularly those which are traditionally regarded as being context-driven, such as word sense disambiguation. The results describe the typical time to convergence of...


Textual Energy of Associative Memories: Performant Applications of Enertex Algorithm in Text Summarization and Topic Segmentation

Hopfield [1, 2] took as a starting point physical systems like the magnetic Ising model (a formalism resulting from statistical physics describing a system composed of units with two possible states named spins) to build a Neural Network (NN) with abilities of learning and recovery of patterns. The capacities and limitations of this Network, called associative memory, were well established in a t...
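The Ising-style energy formalism this abstract refers to can be sketched as follows. This is the standard Hopfield energy E(s) = -½ sᵀWs with zero thresholds, shown as a generic illustration rather than the Enertex algorithm itself; the pattern and network size are invented for demonstration.

```python
import numpy as np

# Hopfield/Ising energy E(s) = -1/2 * s^T W s (zero thresholds assumed).
# With a symmetric, zero-diagonal weight matrix, asynchronous single-unit
# updates never increase this energy, which is what drives the network
# toward stored patterns (energy minima).

def energy(w, s):
    return -0.5 * s @ w @ s

def async_step(w, s, i):
    """Update unit i to the sign of its local field."""
    s = s.copy()
    s[i] = 1 if w[i] @ s >= 0 else -1
    return s

rng = np.random.default_rng(1)
n = 16
p = rng.choice([-1, 1], size=n)          # one stored pattern
w = np.outer(p, p) / n
np.fill_diagonal(w, 0)

s = p.copy()
s[:4] *= -1                              # noisy initial state
energies = [energy(w, s)]
for i in range(n):                       # one sweep of asynchronous updates
    s = async_step(w, s, i)
    energies.append(energy(w, s))

# Energy is non-increasing along the update trajectory.
print(all(e2 <= e1 + 1e-12 for e1, e2 in zip(energies, energies[1:])))
```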


A neuronal basis for the fan effect

The fan effect says that "activation" spreading from a concept is divided among the concepts it spreads to. Because this activation is not a physical entity, but an abstraction of unknown lower-level processes, the spreading-activation model has predictive but not explanatory power. We provide one explanation of the fan effect by showing that distributed neuronal memory networks (specifically, H...




Journal title:
  • CoRR

Volume cmp-lg/9608001  Issue 

Pages  -

Publication date 1996